Stein Variational Gradient Descent with Multiple Kernels
Authors
Abstract
Stein variational gradient descent (SVGD) and its variants have shown promising successes in approximate inference for complex distributions. In practice, we notice that the kernel used in SVGD-based methods has a decisive effect on empirical performance. The radial basis function (RBF) kernel with the median heuristic is a common choice in previous approaches, but unfortunately it has been proved to be sub-optimal. Inspired by the paradigm of Multiple Kernel Learning (MKL), our solution to this flaw is to use a combination of multiple kernels to approximate the optimal kernel, rather than a single one that may limit performance and flexibility. Specifically, we first extend the Kernelized Stein Discrepancy (KSD) to its multiple-kernel view, called Multiple Kernelized Stein Discrepancy (MKSD), and then leverage MKSD to construct a general algorithm, Multiple Kernel SVGD (MK-SVGD). Further, MK-SVGD can automatically assign a weight to each kernel without introducing any other parameters, which means that our method not only gets rid of the dependence on the optimal kernel but also maintains computational efficiency. Experiments on various tasks and models demonstrate that our proposed method consistently matches or outperforms the competing methods.
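To make the mechanism concrete, here is a minimal NumPy sketch of an SVGD step whose update direction is a weighted combination of RBF kernels, in the spirit of MK-SVGD. The bandwidths, the fixed uniform weights, and all function names are illustrative assumptions; the paper derives the per-kernel weights from MKSD rather than fixing them, so this only shows where that weighting enters the update.

    import numpy as np

    def rbf_kernel_and_grad(X, h):
        """RBF kernel matrix K[j, i] = exp(-||x_j - x_i||^2 / (2 h^2))
        and its gradient with respect to the first argument x_j."""
        diffs = X[:, None, :] - X[None, :, :]          # (n, n, d): x_j - x_i
        sq = np.sum(diffs ** 2, axis=-1)               # (n, n)
        K = np.exp(-sq / (2 * h ** 2))
        # grad_{x_j} K[j, i] = -K[j, i] * (x_j - x_i) / h^2
        gradK = -K[:, :, None] * diffs / h ** 2        # (n, n, d)
        return K, gradK

    def svgd_direction(X, score, h):
        """Standard SVGD update direction for one RBF bandwidth.
        score(X) returns grad_x log p(x) per particle, shape (n, d)."""
        n = X.shape[0]
        K, gradK = rbf_kernel_and_grad(X, h)
        # phi(x_i) = (1/n) sum_j [K[j, i] * score(x_j) + grad_{x_j} K[j, i]]
        return (K.T @ score(X) + gradK.sum(axis=0)) / n

    def mk_svgd_step(X, score, bandwidths, weights, step=0.1):
        """One MK-SVGD-style step: the direction is a weighted combination
        of single-kernel SVGD directions (uniform weights are a placeholder
        for the MKSD-derived weights in the paper)."""
        phi = sum(w * svgd_direction(X, score, h)
                  for w, h in zip(weights, bandwidths))
        return X + step * phi

    # Toy usage: approximate a 2-D standard Gaussian, score(x) = -x.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(50, 2)) * 3 + 5               # deliberately off-target
    for _ in range(200):
        X = mk_svgd_step(X, lambda Z: -Z,
                         bandwidths=[0.5, 1.0, 2.0], weights=[1/3, 1/3, 1/3])

Because the kernel enters the SVGD update linearly, the mixture-kernel direction is exactly the weighted sum of the single-kernel directions, which is what makes the multiple-kernel extension cheap.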
Similar Resources
Stein Variational Gradient Descent as Gradient Flow
Stein variational gradient descent (SVGD) is a deterministic sampling algorithm that iteratively transports a set of particles to approximate given distributions, based on a gradient-based update that guarantees to optimally decrease the KL divergence within a function space. This paper develops the first theoretical analysis on SVGD. We establish that the empirical measures of the SVGD samples...
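For reference, the optimally KL-decreasing direction mentioned here has a closed form in the unit ball of the RKHS of a kernel k; this is the standard result from Liu & Wang (2016):

    \phi^\ast(x) = \mathbb{E}_{x' \sim q}\big[\, k(x', x)\, \nabla_{x'} \log p(x') + \nabla_{x'} k(x', x) \,\big]

Replacing the expectation over the current approximation q with the empirical average over the particles yields the practical update rule.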
VAE Learning via Stein Variational Gradient Descent
A new method for learning variational autoencoders (VAEs) is developed, based on Stein variational gradient descent. A key advantage of this approach is that one need not make parametric assumptions about the form of the encoder distribution. Performance is further enhanced by integrating the proposed encoder with importance sampling. Excellent performance is demonstrated across multiple unsupe...
Stein Variational Gradient Descent: Theory and Applications
Although optimization can be done very efficiently using gradient-based optimization these days, Bayesian inference or probabilistic sampling has been considered to be much more difficult. Stein variational gradient descent (SVGD) is a new particle-based inference method derived using a functional gradient descent for minimizing KL divergence without explicit parametric assumptions. SVGD can be...
Learning to Draw Samples with Amortized Stein Variational Gradient Descent
We propose a simple algorithm to train stochastic neural networks to draw samples from given target distributions for probabilistic inference. Our method is based on iteratively adjusting the neural network parameters so that the output changes along a Stein variational gradient direction (Liu & Wang, 2016) that maximally decreases the KL divergence with the target distribution. Our method work...
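A minimal sketch of that amortization idea, using a linear sampler x = W xi + b in place of the paper's stochastic neural network; the function names and the simple chain-rule parameter update are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def svgd_phi(X, score, h=1.0):
        """SVGD update direction with a single RBF kernel of bandwidth h."""
        d = X[:, None, :] - X[None, :, :]              # (n, n, dim): x_j - x_i
        K = np.exp(-np.sum(d ** 2, axis=-1) / (2 * h ** 2))
        # phi(x_i) = (1/n) sum_j [K[j, i] * score(x_j) + grad_{x_j} K[j, i]]
        return (K.T @ score(X) - (K[:, :, None] * d / h ** 2).sum(axis=0)) / len(X)

    def amortized_svgd_step(W, b, score, rng, n=64, lr=0.05):
        """Adjust the sampler parameters so its outputs move along the SVGD
        direction. For the linear sampler x = W @ xi + b the chain rule is
        exact: dx/dW involves xi and dx/db is the identity."""
        xi = rng.normal(size=(n, W.shape[1]))          # base noise
        X = xi @ W.T + b                               # current samples
        phi = svgd_phi(X, score)                       # where SVGD sends them
        W += lr * phi.T @ xi / n
        b += lr * phi.mean(axis=0)
        return W, b

    # Toy usage: train the sampler to draw from a 2-D standard Gaussian.
    rng = np.random.default_rng(0)
    W, b = rng.normal(size=(2, 2)), np.zeros(2)
    for _ in range(500):
        W, b = amortized_svgd_step(W, b, score=lambda Z: -Z, rng=rng)

After training, new samples cost one forward pass of the sampler instead of running the particle iteration, which is the point of the amortization.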
Journal
Journal title: Cognitive Computation
Year: 2022
ISSN: 1866-9964, 1866-9956
DOI: https://doi.org/10.1007/s12559-022-10069-5